Structured d-DNNF Is Not Closed Under Negation
Both structured d-DNNF and SDD can be exponentially more succinct than OBDD, and SDD is essentially as tractable as OBDD. This has left two important open questions. First, does OBDD support more tractable transformations than structured d-DNNF? Second, is structured d-DNNF more succinct than SDD? In this paper, we answer both questions in the affirmative. For the first question, we show that, unlike OBDD, structured d-DNNF does not support polynomial-time negation, disjunction, or existential quantification. As a corollary, we deduce that there are functions with polynomial-size structured d-DNNF representations but no polynomial-size SDD representations, thus answering the second question. We also lift this second result to arithmetic circuits (AC) to show a succinctness gap between PSDD and the monotone AC analogue of structured d-DNNF.
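For intuition on the transformation gap: negation is easy for OBDD because swapping the 0- and 1-terminals of the diagram negates the function it computes. A minimal Python sketch using a hypothetical tuple-based node representation (ours, not the paper's; real OBDD packages operate on a shared DAG with canonical terminals, where the swap is constant-time rather than a traversal):

```python
# Toy OBDD: a node is either a terminal (True / False) or a tuple
# (var, low, high), where low/high are the children for var = 0 / 1.

def negate(node):
    """Negate an OBDD by flipping every terminal node."""
    if isinstance(node, bool):
        return not node
    var, low, high = node
    return (var, negate(low), negate(high))

def evaluate(node, assignment):
    """Evaluate the OBDD under a dict mapping variable name -> bool."""
    while not isinstance(node, bool):
        var, low, high = node
        node = high if assignment[var] else low
    return node

# x1 AND x2 as an OBDD with variable order x1 < x2
f = ("x1", False, ("x2", False, True))
g = negate(f)  # computes NOT (x1 AND x2)
```

By contrast, d-DNNF mixes deterministic disjunctions with decomposable conjunctions, and no analogous local rewriting is available; the result above shows that for structured d-DNNF no polynomial-time negation operation can exist at all.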
Multiclass Learnability Does Not Imply Sample Compression
A hypothesis class admits a sample compression scheme if, for every sample labeled by a hypothesis from the class, it is possible to retain only a small subsample from which the labels of the entire sample can be inferred. The size of the compression scheme is an upper bound on the size of the subsample produced. Every learnable binary hypothesis class (which must necessarily have finite VC dimension) admits a sample compression scheme whose size is a finite function of its VC dimension alone, independent of the sample size. For multiclass hypothesis classes, the analogue of VC dimension is the DS dimension. We show that the corresponding statement about sample compression fails for multiclass hypothesis classes: not every learnable multiclass hypothesis class (which must necessarily have finite DS dimension) admits a sample compression scheme whose size is a finite function of its DS dimension.
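For intuition on the binary case, consider threshold classifiers on the real line ($h_t(x) = 1$ iff $x \ge t$), a class of VC dimension 1 that admits a compression scheme of size one: keep the smallest positively labeled point and recover every label from it. A Python sketch (illustrative only; the function names are ours, not from the paper):

```python
# Sample compression for thresholds h_t(x) = 1 iff x >= t.
# Compress: keep only the smallest positively labeled point.
# Reconstruct: label x positive iff x >= the kept point.

def compress(sample):
    """sample: list of (x, label) pairs realizable by some threshold."""
    positives = [x for x, y in sample if y == 1]
    return [min(positives)] if positives else []

def reconstruct(subsample, xs):
    """Infer labels for all points xs from the retained subsample."""
    if not subsample:
        return [0] * len(xs)  # nothing kept: every point was negative
    t = subsample[0]
    return [1 if x >= t else 0 for x in xs]

sample = [(0.1, 0), (0.4, 0), (0.6, 1), (0.9, 1)]
kept = compress(sample)  # a subsample of size at most 1
labels = reconstruct(kept, [x for x, _ in sample])
```

Correctness rests on realizability: the true threshold $t$ satisfies $t \le p$ for the smallest positive $p$, so every negative point lies below $p$ and every positive point lies at or above it. The paper's result says no such dimension-bounded scheme exists in general for multiclass classes.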
Online Learning and Disambiguations of Partial Concept Classes
Cheung, Tsun-Ming, Hatami, Hamed, Hatami, Pooya, Hosseini, Kaave
In a recent article, Alon, Hanneke, Holzman, and Moran (FOCS '21) introduced a unifying framework for studying the learnability of classes of partial concepts. One of the central questions in their work is whether the learnability of a partial concept class is always inherited from the learnability of some "extension" of it to a total concept class. They showed this is not the case for PAC learning but left the problem open for the stronger notion of online learnability. We resolve this problem by constructing a class of partial concepts that is online learnable, but no extension of it to a class of total concepts is online learnable (or even PAC learnable).
Multiclass Learnability Beyond the PAC Framework: Universal Rates and Partial Concept Classes
Kalavasis, Alkis, Velegkas, Grigoris, Karbasi, Amin
In this paper we study the problem of multiclass classification with a bounded number $k$ of different labels, in the realizable setting. We extend the traditional PAC model to (a) distribution-dependent learning rates and (b) learning rates under data-dependent assumptions. First, we consider the universal learning setting (Bousquet, Hanneke, Moran, van Handel and Yehudayoff, STOC '21), for which we provide a complete characterization of the achievable learning rates, holding for every fixed distribution. In particular, we show the following trichotomy: for any concept class, the optimal learning rate is either exponential, linear, or arbitrarily slow. Additionally, we provide complexity measures of the underlying hypothesis class that characterize when these rates occur. Second, we consider the problem of multiclass classification with structured data (such as data lying on a low-dimensional manifold or satisfying margin conditions), a setting captured by partial concept classes (Alon, Hanneke, Holzman and Moran, FOCS '21). Partial concepts are functions that can be undefined on certain parts of the input space. We extend the traditional PAC learnability of total concept classes to partial concept classes in the multiclass setting and investigate the differences between partial and total concepts.
Artificial intelligence
AI, as it's called, is becoming increasingly popular (or unpopular, depending on your view). In 1951, author Arthur C. Clarke published a series of science-fiction short stories, including the one that, 17 years later, he and writer/producer/director Stanley Kubrick developed into the historic film "2001: A Space Odyssey." Among other things, Odyssey explored what happens when humans interact with a computer that begins to think like them, and that computer (HAL 9000) takes on a mind of its own. When Wozniak and Jobs created Apple, the goal was to get computers to think like people, so the two could readily understand each other. That's why the trash icon looks like a garbage receptacle: "Getting rid of garbage? Throw it in the can."
TE-ETH: Lower Bounds for QBFs of Bounded Treewidth
Fichte, Johannes Klaus, Hecher, Markus, Pfandler, Andreas
The problem of deciding the validity (QSAT) of quantified Boolean formulas (QBF) is an active research area in both theory and practice. In the field of parameterized algorithmics, the well-studied graph measure treewidth has turned out to be a successful parameter. A well-known result by Chen in parameterized complexity is that QSAT, parameterized by the treewidth of the primal graph of the input formula together with the quantifier depth of the formula, is fixed-parameter tractable. More precisely, the runtime of such an algorithm is polynomial in the formula size and exponential in the treewidth, where the exponential function in the treewidth is a tower of exponentials whose height is the quantifier depth. A natural question is whether one can significantly improve these results and decrease the height of the tower, assuming the Exponential Time Hypothesis (ETH). In recent years, there has been growing interest in establishing lower bounds under the ETH, mostly problem-specific lower bounds up to the third level of the polynomial hierarchy. Still, an important question is to settle this as generally as possible and to cover the whole polynomial hierarchy. In this work, we show lower bounds based on the ETH for arbitrary QBFs parameterized by treewidth (and quantifier depth). More formally, we establish lower bounds for QSAT parameterized by treewidth, namely, that under the ETH there cannot be an algorithm that solves QSAT of quantifier depth i in runtime significantly better than i-fold exponential in the treewidth and polynomial in the input size. In doing so, we provide a versatile reduction technique to compress treewidth that encodes the essence of dynamic programming on arbitrary tree decompositions. Further, we describe a general methodology for a more fine-grained analysis of problems parameterized by treewidth that lie at higher levels of the polynomial hierarchy.
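The i-fold exponential tower can be written out explicitly. In our notation (a paraphrase of the bound's shape, not the paper's exact statement), with treewidth $k$, quantifier depth $i$, and input formula $\varphi$:

```latex
\operatorname{tower}(1,k) = 2^{k},
\qquad
\operatorname{tower}(i,k) = 2^{\operatorname{tower}(i-1,\,k)},
```

so Chen's algorithm runs in time $\operatorname{tower}(i, \mathcal{O}(k)) \cdot \operatorname{poly}(\lVert\varphi\rVert)$, while the lower bound says that, under the ETH, no algorithm decides QSAT at quantifier depth $i$ in time $\operatorname{tower}(i, o(k)) \cdot \operatorname{poly}(\lVert\varphi\rVert)$.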